Search results for "Conditional mutual information"

Showing 5 of 5 documents

Wiener-Granger Causality in Network Physiology with Applications to Cardiovascular Control and Neuroscience

2016

Since C. W. J. Granger's operative definition of an idea expressed by N. Wiener, Wiener–Granger causality (WGC) has been one of the most relevant concepts exploited by modern time series analysis. Indeed, in networks formed by multiple components, working according to the notion of segregation and interacting with each other according to the principle of integration, inferring causality has opened a window on the effective connectivity of the network and has linked experimental evidence to functions and mechanisms. This tutorial reviews predictability-improvement, information-based, and frequency-domain methods for inferring WGC among physiological processes from multivariate…
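As a concrete illustration of the predictability-improvement reading of WGC surveyed in this tutorial, the sketch below compares the residual variance of a restricted autoregressive model of the target with that of a full model that also includes the past of the driver, under linear-Gaussian assumptions. The function name, model order, and toy series are illustrative assumptions, not taken from the paper.

```python
# Minimal sketch of bivariate Wiener-Granger causality as predictability
# improvement (Geweke-style log variance ratio), assuming linear dynamics.
import numpy as np

def granger_causality(x, y, order=2):
    """F(x->y) = ln(var_restricted / var_full); > 0 suggests x helps predict y."""
    rows = range(order, len(y))
    target = np.array([y[t] for t in rows])
    past_y = np.array([y[t - order:t] for t in rows])   # own past (restricted)
    past_x = np.array([x[t - order:t] for t in rows])   # driver past
    full = np.hstack([past_y, past_x])

    def residual_var(design):
        coef, *_ = np.linalg.lstsq(design, target, rcond=None)
        return np.var(target - design @ coef)

    return np.log(residual_var(past_y) / residual_var(full))

# Toy example: x drives y with a one-step delay, so F(x->y) should be > 0.
rng = np.random.default_rng(0)
x = rng.standard_normal(2000)
y = np.zeros_like(x)
for t in range(1, len(x)):
    y[t] = 0.8 * x[t - 1] + 0.3 * rng.standard_normal()
print(granger_causality(x, y))
```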

Keywords: nonlinear dynamics; computer science; reliability (computer networking); biomedical signal processing; physiology; cardiovascular control; dynamical systems; directionality; Granger causality; multivariate regression modeling; time series analysis; predictability; time series; electrical and electronic engineering; statistical hypothesis testing; heart rate variability; transfer entropy; partial directed coherence; prediction; coupling strength; causality; conditional mutual information; frequency domain; spectral decomposition; artificial intelligence; complexity; neuroscience; Settore ING-INF/06 - Bioingegneria Elettronica e Informatica

Information decomposition of multichannel EMG to map functional interactions in the distributed motor system

2019

The central nervous system needs to coordinate multiple muscles during postural control. Functional coordination is established through the neural circuitry that interconnects different muscles. Here we used multivariate information decomposition of multichannel EMG acquired from 14 healthy participants during postural tasks to investigate the neural interactions between muscles. A set of information measures was estimated from an instantaneous linear regression model and a time-lagged VAR model fitted to the EMG envelopes of 36 muscles. We used network analysis to quantify the structure of functional interactions between muscles and compared them across experimental conditions. Co…
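In the linear-Gaussian setting the abstract describes (information measures estimated from linear regression models), mutual information and conditional mutual information between channels follow directly from correlation and partial correlation. The sketch below illustrates this estimator on a toy stand-in for EMG envelopes; all names and the conditioning-on-one-channel simplification are assumptions, not the authors' code.

```python
# Linear-Gaussian information estimates of the kind used for muscle networks:
# MI from correlation, conditional MI from partial correlation (nats).
import numpy as np

def gaussian_mi(x, y):
    """I(X;Y) = -0.5 ln(1 - rho^2) under joint Gaussianity."""
    rho = np.corrcoef(x, y)[0, 1]
    return -0.5 * np.log(1.0 - rho**2)

def gaussian_cmi(x, y, z):
    """I(X;Y|Z) from the partial correlation of x and y given channel z."""
    rxy = np.corrcoef(x, y)[0, 1]
    rxz = np.corrcoef(x, z)[0, 1]
    ryz = np.corrcoef(y, z)[0, 1]
    partial = (rxy - rxz * ryz) / np.sqrt((1 - rxz**2) * (1 - ryz**2))
    return -0.5 * np.log(1.0 - partial**2)

# Toy envelopes: z is a shared drive to x and y, so I(X;Y) is large
# while I(X;Y|Z) shrinks toward zero once the common input is conditioned out.
rng = np.random.default_rng(1)
z = rng.standard_normal(5000)
x = z + 0.5 * rng.standard_normal(5000)
y = z + 0.5 * rng.standard_normal(5000)
print(gaussian_mi(x, y), gaussian_cmi(x, y, z))
```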

Keywords: information transfer; muscle networks; neurology; transfer entropy; computer science; social sciences; medicine and health sciences; postural control; functional connectivity; connectivity; neural pathways; decomposition; motor control; muscle activity; postural balance; conditional mutual information; synchronization; spinal reflex; cortex; cognitive neuroscience; posture; central nervous system; organization; Granger causality; reflex; motor system; coherence; biological neural networks; humans; male; female; adult; muscle, skeletal; signal processing; identification; electromyography; muscle synergies; brain networks; neuroscience; Settore ING-INF/06 - Bioingegneria Elettronica e Informatica

Accelerating Causal Inference and Feature Selection Methods through G-Test Computation Reuse

2021

This article presents a novel and remarkably efficient method of computing the statistical G-test, made possible by exploiting a connection with the fundamental elements of information theory: by writing the G statistic as a sum of joint entropy terms, its computation is decomposed into easily reusable partial results with no change in the resulting value. This method greatly improves the efficiency of applications that perform a series of G-tests on permutations of the same features, such as feature selection and causal inference, because the decomposition allows for intensive reuse of these partial results. The efficiency of this method is demonstrated by implementing it as…
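The decomposition the article describes can be sketched concretely: the (conditional) G statistic equals 2N times a combination of joint entropies, e.g. G = 2N·[H(X,Z) + H(Y,Z) − H(X,Y,Z) − H(Z)] in nats, and each entropy term depends only on a subset of variables, so it can be cached and shared across many tests on permutations of the same features. The caching class below is a minimal illustration under that standard identity, not the authors' implementation.

```python
# G-test via joint-entropy decomposition with computation reuse:
# entropy terms are keyed by variable subsets and cached across tests.
import numpy as np
from collections import Counter

class GTestEngine:
    def __init__(self, data):
        self.data = np.asarray(data)   # shape (n_samples, n_vars), discrete
        self.n = self.data.shape[0]
        self._h_cache = {}             # column subset -> joint entropy (nats)

    def joint_entropy(self, cols):
        key = tuple(sorted(cols))
        if key not in self._h_cache:   # reusable partial result
            counts = Counter(map(tuple, self.data[:, list(key)]))
            p = np.array(list(counts.values())) / self.n
            self._h_cache[key] = -np.sum(p * np.log(p))
        return self._h_cache[key]

    def g_statistic(self, x, y, cond=()):
        """G for independence of columns x and y given conditioning set cond."""
        z = tuple(cond)
        h = self.joint_entropy
        if not z:
            return 2 * self.n * (h((x,)) + h((y,)) - h((x, y)))
        return 2 * self.n * (h((x,) + z) + h((y,) + z) - h((x, y) + z) - h(z))

# Repeated tests over permutations of the same features hit the entropy cache.
rng = np.random.default_rng(2)
d = rng.integers(0, 3, size=(1000, 4))
engine = GTestEngine(d)
print(engine.g_statistic(0, 1), engine.g_statistic(0, 1, cond=(2,)))
```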

Keywords: Markov blanket; computer science; computation reuse; conditional mutual information; computation; general physics and astronomy; feature selection; information theory; astrophysics; joint entropy; G-test; causal inference; algorithm; entropy

Estimating the decomposition of predictive information in multivariate systems

2015

In the study of complex systems from observed multivariate time series, the evolution of one system is often under investigation; it can be explained in terms of the information stored within the system and the information transferred to it from the other interacting systems. We present a framework for the model-free estimation of information storage and information transfer, computed as the terms composing the predictive information about the target of a multivariate dynamical process. The approach tackles the curse of dimensionality by employing a nonuniform embedding scheme that selects progressively, among the past components of the multivariate process, only those that contribute most, in terms of co…
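The decomposition being estimated can be sketched in a simplified linear-Gaussian form: predictive information I(Y_n; Y⁻, X⁻) splits into information storage I(Y_n; Y⁻) plus information transfer I(Y_n; X⁻ | Y⁻). The code below uses a uniform embedding of fixed order as a simplifying assumption; the paper's estimator is model-free and selects past components nonuniformly.

```python
# Linear-Gaussian sketch: predictive information = storage + transfer,
# with each term computed from conditional variances of linear regressions.
import numpy as np

def cond_var(target, design):
    """Residual variance of target after linear regression on design."""
    coef, *_ = np.linalg.lstsq(design, target, rcond=None)
    return np.var(target - design @ coef)

def decompose(x, y, order=2):
    rows = range(order, len(y))
    yn = np.array([y[t] for t in rows])
    y_past = np.array([y[t - order:t] for t in rows])
    x_past = np.array([x[t - order:t] for t in rows])
    both = np.hstack([y_past, x_past])
    storage = 0.5 * np.log(np.var(yn) / cond_var(yn, y_past))       # I(Yn; Y-)
    transfer = 0.5 * np.log(cond_var(yn, y_past) / cond_var(yn, both))  # TE x->y
    return storage, transfer, storage + transfer  # sum = predictive information

# Toy system: y stores information (autoregression) and receives it from x.
rng = np.random.default_rng(3)
x = rng.standard_normal(3000)
y = np.zeros_like(x)
for t in range(1, len(x)):
    y[t] = 0.6 * y[t - 1] + 0.5 * x[t - 1] + 0.3 * rng.standard_normal()
print(decompose(x, y))
```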

Keywords: statistics and probability; computer science; entropy; transfer entropy; approximate entropy; conditional entropy; joint entropy; binary entropy function; entropy (information theory); maximum entropy spectral estimation; information theory; information storage and retrieval; Granger causality; stochastic processes; nonlinear dynamics; linear models; models, theoretical; multivariate analysis; heart rate; heart rate variability; cardiorespiratory; sleep; sleep EEG; electroencephalography; brain; oscillations; mechanisms; complexity; physiological time series; conditional mutual information; statistics; condensed matter physics; statistical and nonlinear physics; algorithm; Settore ING-INF/06 - Bioingegneria Elettronica e Informatica

Functional connectivity inference from fMRI data using multivariate information measures

2022

Shannon’s entropy, or an extension of it, can be used to quantify information transmission between or among variables. Mutual information is the pair-wise measure that captures nonlinear relationships between variables; it is more robust than linear correlation methods. Beyond mutual information, two generalizations are defined for multivariate distributions: interaction information (co-information) and total correlation (multi-mutual information). In comparison to mutual information, interaction information and total correlation are underutilized and poorly studied in applied neuroscience research. Quantifying information flow between brain regions is not explic…
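Under a Gaussian assumption, where entropies follow from covariance determinants, both multivariate generalizations reduce to simple formulas. The sketch below is illustrative only: the Gaussian estimator, the three-variable case, and the sign convention (co-information = I(X;Y) − I(X;Y|Z), positive for redundancy) are assumptions, and conventions for interaction information differ in the literature.

```python
# Total correlation and co-information for three Gaussian variables,
# computed by inclusion-exclusion over joint entropies (nats).
import numpy as np

def gaussian_entropy(cov):
    """Differential entropy 0.5 * ln((2*pi*e)^k * det(cov))."""
    k = cov.shape[0]
    return 0.5 * (k * np.log(2 * np.pi * np.e) + np.log(np.linalg.det(cov)))

def total_correlation(data):
    """TC = sum_i H(X_i) - H(X_1..X_k); zero iff the Gaussians are independent."""
    cov = np.cov(data, rowvar=False)
    marginals = sum(gaussian_entropy(np.array([[cov[i, i]]]))
                    for i in range(cov.shape[0]))
    return marginals - gaussian_entropy(cov)

def co_information(data):
    """I(X;Y) - I(X;Y|Z), expanded via inclusion-exclusion of entropies."""
    cov = np.cov(data, rowvar=False)
    h = lambda idx: gaussian_entropy(cov[np.ix_(idx, idx)])
    return (h([0]) + h([1]) + h([2])
            - h([0, 1]) - h([0, 2]) - h([1, 2]) + h([0, 1, 2]))

# Toy stand-in for three region time series with a shared (redundant) source:
# both measures come out positive, flagging the redundancy.
rng = np.random.default_rng(4)
s = rng.standard_normal(5000)
data = np.column_stack([s + 0.6 * rng.standard_normal(5000) for _ in range(3)])
print(total_correlation(data), co_information(data))
```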

Keywords: brain mapping; computer science; entropy; cognitive neuroscience; conditional mutual information; brain; multivariate normal distribution; mutual information; magnetic resonance imaging; interaction information; redundancy (information theory); artificial intelligence; computer simulation; total correlation; information flow; data mining; neural networks